Increase upstream response caching #674

Merged
merged 6 commits on Feb 22, 2017
Conversation

@minrk (Member) commented Feb 22, 2017

  • Stop expiring the cache of upstream responses. Since we still make a follow-up request, there is no cost in caching these for as long as possible. Memcache will clean up the oldest entries if it starts to fill up.
  • When the follow-up request fails, reuse the cached response instead of failing (a sketch of both behaviors follows below).
  • Reduce the load impact of bots by always using a cached response if there is one. Saving this for a later PR, since it's not as simple as I thought to tell which incoming request an outgoing request is on behalf of.
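A minimal sketch of the first two points, assuming a Tornado-style async HTTP client and a memcache-like async `cache` client with `get`/`set`; the function and key names here are illustrative, not the project's actual API:

```python
import pickle

from tornado.httpclient import AsyncHTTPClient, HTTPError


async def fetch_and_cache(url, cache):
    """Fetch `url`, caching the response body with no expiry.

    In the memcached protocol, exptime=0 means "never expire", so
    the server evicts the oldest entries on its own as it fills up.
    """
    key = "upstream:" + url
    client = AsyncHTTPClient()
    try:
        response = await client.fetch(url)
    except HTTPError:
        # The follow-up request failed (rate limit, upstream outage):
        # reuse the cached response instead of failing outright.
        cached = await cache.get(key)
        if cached is not None:
            return pickle.loads(cached)
        raise
    await cache.set(key, pickle.dumps(response.body), exptime=0)
    return response.body
```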

The commit messages elaborate:

  • The only harm in this being high is cache size, because we still send a new request and check for 304, etc.
  • Allows us to show stale results when we hit rate limits, or when upstream servers are down.
  • Use cached responses if available, without checking for updates, instead of expiring them.
  • Let the cache prune itself.
  • It's a bit harder than it looks, because the client doesn't have a handle on the parent request.
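The "check for 304" in the first message is ordinary HTTP revalidation: a fresh request still goes out every time, carrying the cached ETag, so a long-lived cache only costs space. A hedged sketch (the `cache` interface is hypothetical and assumed to handle serialization):

```python
from tornado.httpclient import AsyncHTTPClient


async def fetch_with_revalidation(url, cache):
    """Always ask the origin, but reuse the cached body on 304.

    Caching "as long as possible" is safe because the cached body is
    only served when the origin confirms it is still current with
    304 Not Modified.
    """
    client = AsyncHTTPClient()
    cached = await cache.get(url)  # e.g. {"etag": ..., "body": ...}
    headers = {}
    if cached:
        headers["If-None-Match"] = cached["etag"]
    response = await client.fetch(url, headers=headers, raise_error=False)
    if response.code == 304 and cached:
        return cached["body"]
    if response.code == 200:
        await cache.set(url, {
            "etag": response.headers.get("Etag"),
            "body": response.body,
        })
    return response.body
```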
@minrk changed the title from "[WIP] Increase upstream response caching" to "Increase upstream response caching" on Feb 22, 2017.
The rate limit will be 0 for the last successful request with status 200.
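This refers to GitHub's rate-limit headers: the last request that still succeeds with a 200 already reports X-RateLimit-Remaining: 0, so "remaining is 0" alone does not mean the response is unusable. A hypothetical check:

```python
def rate_limit_exhausted(response):
    """Return True only if the rate limit is actually exhausted.

    GitHub sends X-RateLimit-Remaining on every response, and the last
    successful (200) request already carries remaining == 0, so a 200
    with remaining == 0 is still a usable response.
    """
    remaining = int(response.headers.get("X-RateLimit-Remaining", 1))
    return remaining == 0 and response.code != 200
```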